This article was originally published by Autodesk's Redshift publication as "Next-Gen Virtual Reality Will Let You Create From Scratch—Right Inside VR."
The architecture and manufacturing industries are about to undergo a radical shift in how they make things. In the near future, designers and engineers will be able to create products, buildings, and cities in real time, in virtual reality (VR).
In predicting VR’s dramatic evolution, an analogy to early cinematic history is apt: As one legend has it, when the motion-picture camera first came out, actors were filmed on a set, in front of fake trees. Then someone said, “Why don’t you just put the camera in the forest?” Simple, but game-changing. VR technology is already available, and it’s only a matter of time before it is used to its full potential.
What’s Here Now: Visualization
At a dedicated VR station inside the Los Angeles office of John A. Martin & Associates, where I am a Building Information Modeling (BIM) director, colleagues strap on eye-tracking headsets and use handheld controllers to navigate through 3D models created with BIM software. Visualizing a design in this context lets users detect structural irregularities they might otherwise miss.
For example, in VR, you can see if a beam is not properly connected to a girder. Sure, this is possible without a VR headset, but being completely immersed in a 3D environment makes you feel as if you are standing in that actual physical spot. It’s easier to detect building components that are not in the correct location.
VR has made great strides as a visualization tool—its dominant use in the architecture, engineering, and construction industries—both within firms and for use with clients. Using handheld laser point-and-click controllers, engineers and designers can move through 3D building renderings as if they’re in a first-person video game simulation. They can float up staircases, teleport down hallways, or peer out of upper-story windows. It is truly amazing.
Design visualizations can also help firms sell ideas to stakeholders. By deploying 3D building models as playable “games” with VR-capable software such as Revit Live, 3ds Max, and Enscape, designers can invite clients and owners into immersive showcases of their prospective projects.
What’s Coming: Creation
Still, these examples only scratch the surface of VR’s potential. The next big opportunity for designers and engineers will move beyond visualization to actually creating structures and products from scratch in VR.
Imagine VR for Revit: What if you could put on an eye-tracking headset and, with the movement of your hands and wrists, grab a footing, scale a model, lay it out, push it, spin it, and change its shape?
That scenario may not be far off. Programs like Google Tilt Brush, which lets you paint in a 3D VR environment, could signal what’s coming for creating design projects in VR. Simply by rotating your wrist in the painting tool, you can color an object in a VR environment. That kind of physically responsive design functionality is not yet available in the VR platforms used by most architecture and manufacturing firms, but its existence elsewhere suggests it could migrate into those industries.
There are 3D mesh and surface modelers that allow designers to form smoothly curved, organic shapes—car bodies, canopies, and the like—but those shapes are built on a 2D screen using tedious mouse movements and keyboard commands. To manipulate nodes and lines, users pull and drag cursors—a clumsy way of doing things in an age of VR.
If designers could create directly in VR, rather than using external desktop software, they could peer around rear walls and teleport to tight spots, such as joints and moldings. By working at a closer, more maneuverable range to objects, designers could create more organic shapes with a higher level of granular detail. Artists and artisans learned a long time ago to use their hands to sculpt with stone and clay—and while that ability doesn’t directly apply to the realities of designing things like buildings and cars, there’s an opportunity to bring it back in a virtual way.
What Needs to Change: Interactivity
Before VR sees widespread adoption as a creation tool in the architecture and manufacturing industries, the software must make a significant leap forward. As it stands, most game-engine technology only lets users look around, not touch objects or edit on the fly. For example, if you are viewing your model in VR and want to correct a beam, you must take the headset off, set it down, find the beam in the authoring software, make the change with a mouse and keyboard, update the model in the game-engine viewer, put the headset back on, and verify that the change took effect. That workflow is long and tedious.
The future of VR needs to move beyond taking the VR headset off and relying on mouse-and-keyboard clicks to make changes. Architecture and manufacturing design software should take full advantage of VR’s handheld controllers and immersive environment, as well as provide tools within the experience to interact with and make changes to 3D models.
Another stumbling block is the lack of automated interactivity inside VR. Any action a user might take in VR—move a beam, open a window, or turn on a light—must be preprogrammed by an experienced game-engine programmer to make it interactive. A better solution could be to automate this process. For example, the Revit 3D model could automatically be converted into a game-engine environment that is VR capable, with interactivity already programmed in, so anytime a user wants to move a wall, open a door, or flex any type of component within the VR environment, it’s possible.
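The automated conversion described above can be illustrated with a short sketch. This is a toy Python illustration, not a real Revit or game-engine API—the `BimElement` and `VRScene` classes and all parameter names are invented for the example. The point it demonstrates: if the converter exposes every flexible parameter in the model as an interactive control, no per-element scripting is needed.

```python
# Hypothetical sketch of auto-generated VR interactivity from BIM parameters.
# BimElement, VRScene, and the parameter names are illustrative assumptions,
# not any real Autodesk or game-engine API.

from dataclasses import dataclass, field


@dataclass
class BimElement:
    """A simplified BIM element with flexible parameters (e.g., a door's swing angle)."""
    name: str
    category: str                      # e.g., "Door", "Wall", "Light"
    parameters: dict = field(default_factory=dict)


class VRScene:
    """Toy VR scene that exposes every imported BIM parameter as an editable control."""

    def __init__(self):
        self.controls = {}

    def import_model(self, elements):
        # Instead of hand-programming each interaction, register every
        # parameter automatically: anything the BIM model can flex,
        # the VR environment can flex too.
        for el in elements:
            for param, value in el.parameters.items():
                self.controls[(el.name, param)] = value

    def adjust(self, element_name, param, value):
        # What a controller gesture would trigger in-headset.
        key = (element_name, param)
        if key not in self.controls:
            raise KeyError(f"{element_name} has no parameter {param!r}")
        self.controls[key] = value
        return self.controls[key]


# A door imported from the authoring model becomes adjustable in VR
# with no element-specific code.
door = BimElement("D-101", "Door", {"open_angle_deg": 0, "width_mm": 915})
scene = VRScene()
scene.import_model([door])
scene.adjust("D-101", "open_angle_deg", 90)   # "open the door" in VR
```

The design choice worth noting is that interactivity comes from the model's own parameter data rather than from hand-written game-engine scripts—which is exactly what would let a converter make every wall, door, and component flexible by default.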
Information modeling is like a living, breathing thing: A building, door, window, table, or piece of medical equipment all have flexibility in their parameters. In most game-engine-based technologies used today, these elements are static—for now. VR is about to evolve. Are you ready?